From differential equation solvers to accelerated first-order methods for convex optimization

Authors

Abstract

Convergence analysis of accelerated first-order methods for convex optimization problems is developed from the point of view of ordinary differential equation (ODE) solvers. A new dynamical system, called the Nesterov accelerated gradient (NAG) flow, is derived from the connection between the acceleration mechanism and the A-stability of ODE solvers, and exponential decay of a tailored Lyapunov function along the solution trajectory is proved. Numerical discretizations of the NAG flow are then considered, and convergence rates are established via a discrete Lyapunov function. The proposed ODE-solver approach can not only cover existing accelerated methods, such as FISTA, Güler's proximal algorithm, and Nesterov's accelerated gradient method, but also produce new algorithms for composite convex optimization that possess accelerated convergence rates. Both the convex and the strongly convex cases are handled in a unified way by our approach.
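To make the ODE-to-algorithm connection concrete, below is a minimal sketch of the classical Nesterov accelerated gradient iteration, which can be read as an explicit discretization of an underlying flow. This is an illustrative baseline only, not the paper's specific NAG-flow discretization; the quadratic objective and parameter choices are assumptions made for the example.

```python
import numpy as np

def nesterov_agd(grad, x0, L, num_iters=100):
    """Nesterov's accelerated gradient method for smooth convex f with an
    L-Lipschitz gradient: a gradient step at an extrapolated point,
    followed by a momentum update."""
    x = y = np.asarray(x0, dtype=float)
    t = 1.0
    for _ in range(num_iters):
        x_next = y - grad(y) / L                          # forward (gradient) step
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t ** 2)) / 2.0
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)  # momentum extrapolation
        x, t = x_next, t_next
    return x

# Usage on an assumed quadratic f(x) = 0.5 x^T A x - b^T x:
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
L = np.linalg.eigvalsh(A).max()   # Lipschitz constant of grad f
x_star = nesterov_agd(lambda x: A @ x - b, np.zeros(2), L)
```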


Related articles

First-order Methods for Geodesically Convex Optimization

Geodesic convexity generalizes the notion of (vector space) convexity to nonlinear metric spaces. But unlike convex optimization, geodesically convex (g-convex) optimization is much less developed. In this paper we contribute to the understanding of g-convex optimization by developing iteration complexity analysis for several first-order algorithms on Hadamard manifolds. Specifically, we prove ...
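As a schematic illustration of the first-order template analyzed there, here is generic Riemannian gradient descent, where each step moves along a geodesic through the manifold's exponential map. The manifold operations are supplied as placeholders, and the trivial Euclidean instance at the end is an assumed example for demonstration only.

```python
import numpy as np

def riemannian_gd(rgrad, exp_map, x0, step, num_iters=100):
    """Generic Riemannian gradient descent: instead of straight-line steps,
    move along geodesics via the exponential map. rgrad (Riemannian
    gradient) and exp_map are placeholders supplied by the caller."""
    x = x0
    for _ in range(num_iters):
        x = exp_map(x, -step * rgrad(x))  # geodesic step in the descent direction
    return x

# Trivial Euclidean instance, where exp_map(x, v) = x + v:
target = np.array([1.0, 2.0])
x_min = riemannian_gd(lambda x: 2.0 * (x - target),
                      lambda x, v: x + v,
                      np.zeros(2), step=0.4)
```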

An adaptive accelerated first-order method for convex optimization

In this paper, we present a new accelerated variant of Nesterov’s method for solving a class of convex optimization problems, in which certain acceleration parameters are adaptively (and aggressively) chosen so as to: (i) preserve the theoretical iteration-complexity of the original method, and (ii) substantially improve its practical performance in comparison to the other existing variants. Computatio...
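One well-known way to choose acceleration parameters adaptively is gradient-based restart, which resets the momentum whenever it opposes the descent direction. The sketch below uses that heuristic purely for illustration; it is not the specific parameter rule proposed in the paper above.

```python
import numpy as np

def agd_with_restart(grad, x0, L, num_iters=200):
    """Accelerated gradient descent with gradient-based adaptive restart:
    momentum is reset whenever the momentum direction opposes the current
    gradient (a common adaptive heuristic, not the rule of the paper)."""
    x = y = np.asarray(x0, dtype=float)
    t = 1.0
    for _ in range(num_iters):
        g = grad(y)
        x_next = y - g / L
        if g @ (x_next - x) > 0:   # momentum fighting the descent direction
            t = 1.0                # restart: drop accumulated momentum
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t ** 2)) / 2.0
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)
        x, t = x_next, t_next
    return x

# Usage mirrors nesterov_agd above, e.g. agd_with_restart(lambda x: A @ x - b, np.zeros(2), L)
```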

Accelerated first-order methods for large-scale convex minimization

This paper discusses several (sub)gradient methods attaining the optimal complexity for smooth problems with Lipschitz continuous gradients, nonsmooth problems with bounded variation of subgradients, and weakly smooth problems with Hölder continuous gradients. The proposed schemes are optimal for smooth strongly convex problems with Lipschitz continuous gradients and optimal up to a logarithmic fac...
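For the nonsmooth case mentioned above, the classical subgradient method with diminishing steps already attains the optimal O(1/√k) rate; a minimal sketch follows, with an assumed ℓ1 example.

```python
import numpy as np

def subgradient_method(subgrad, x0, R, G, num_iters=1000):
    """Classical subgradient method with diminishing steps R/(G*sqrt(k)):
    attains the optimal O(1/sqrt(k)) rate when subgradients are bounded by
    G and ||x0 - x*|| <= R. The averaged iterate carries the guarantee."""
    x = np.asarray(x0, dtype=float)
    x_sum = np.zeros_like(x)
    for k in range(1, num_iters + 1):
        x = x - (R / (G * np.sqrt(k))) * subgrad(x)
        x_sum += x
    return x_sum / num_iters

# Assumed example: f(x) = ||x||_1, whose subgradient is sign(x):
x_hat = subgradient_method(np.sign, np.array([3.0, -2.0]), R=4.0, G=np.sqrt(2.0))
```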

Fast First-Order Methods for Composite Convex Optimization with Backtracking

We propose new versions of accelerated first-order methods for convex composite optimization, where the prox parameter is allowed to increase from one iteration to the next. In particular, we show that a full backtracking strategy can be used within the FISTA [1] and FALM algorithms [7] while preserving their worst-case iteration complexities of O(√(L(f)/ε)). In the original versions of FISTA an...
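For reference, here is a sketch of standard FISTA with the classical backtracking rule, in which the Lipschitz estimate L can only increase (the step size only shrinks); the full-backtracking variant discussed in the paper additionally allows the prox parameter to grow again between iterations. The lasso instance at the end is an assumed example.

```python
import numpy as np

def fista_backtracking(f, grad, prox, x0, L0=1.0, eta=2.0, num_iters=100):
    """FISTA for composite problems min f(x) + g(x), with the classical
    backtracking rule: L is increased until the quadratic upper bound on f
    holds. Here prox(v, s) = argmin_u g(u) + ||u - v||^2 / (2 s)."""
    x = y = np.asarray(x0, dtype=float)
    t, L = 1.0, L0
    for _ in range(num_iters):
        g = grad(y)
        while True:                        # backtracking line search on L
            x_next = prox(y - g / L, 1.0 / L)
            d = x_next - y
            if f(x_next) <= f(y) + g @ d + 0.5 * L * (d @ d):
                break
            L *= eta
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t ** 2)) / 2.0
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)
        x, t = x_next, t_next
    return x

# Assumed lasso example: f(x) = 0.5 ||Ax - b||^2, g(x) = lam ||x||_1.
A = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
b = np.array([1.0, 2.0, 3.0])
lam = 0.1
soft = lambda v, s: np.sign(v) * np.maximum(np.abs(v) - lam * s, 0.0)
x_hat = fista_backtracking(lambda x: 0.5 * np.sum((A @ x - b) ** 2),
                           lambda x: A.T @ (A @ x - b), soft, np.zeros(2))
```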

Journal

Journal title: Mathematical Programming

Year: 2021

ISSN: 0025-5610, 1436-4646

DOI: https://doi.org/10.1007/s10107-021-01713-3